Divergence based Learning Vector Quantization
Abstract
We suggest the use of alternative distance measures for similarity-based classification in Learning Vector Quantization. Divergences can be employed whenever the data consists of non-negative, normalized features, which is the case for, e.g., spectral data or histograms. As examples, we derive gradient-based training algorithms in the framework of Generalized Learning Vector Quantization based on the so-called Cauchy-Schwarz divergence and a non-symmetric Rényi divergence. As a first test we apply the methods to two different biomedical data sets and compare with the use of the standard Euclidean distance.
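For reference, the two divergences named above have standard discrete forms for non-negative (typically normalized) vectors p and q; the exact parameterization used in the paper may differ:

D_{CS}(p, q) = \tfrac{1}{2}\log\Big(\sum_i p_i^2 \sum_i q_i^2\Big) - \log \sum_i p_i q_i

D_\alpha(p \,\|\, q) = \frac{1}{\alpha - 1} \log \sum_i p_i^\alpha q_i^{1-\alpha}, \qquad \alpha > 0,\ \alpha \neq 1

The Rényi divergence is non-symmetric in p and q, matching the abstract's description, and recovers the Kullback-Leibler divergence in the limit α → 1.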
Similar references
Divergence-based classification in learning vector quantization
We discuss the use of divergences in dissimilarity-based classification. Divergences can be employed whenever vectorial data consists of non-negative, potentially normalized features. This is, for instance, the case in spectral data or histograms. In particular, we introduce and study Divergence Based Learning Vector Quantization (DLVQ). We derive cost-function-based DLVQ schemes for the family...
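To make the cost-function-based scheme concrete, here is a minimal sketch of one stochastic GLVQ update in which the squared Euclidean distance is replaced by the Cauchy-Schwarz divergence. All function and variable names are illustrative, not taken from the papers, and the sketch assumes non-negative, normalized feature vectors:

import numpy as np

def cs_divergence(p, q, eps=1e-12):
    # Cauchy-Schwarz divergence of two non-negative feature vectors
    return (0.5 * np.log(np.dot(p, p) * np.dot(q, q) + eps)
            - np.log(np.dot(p, q) + eps))

def cs_grad_prototype(x, w, eps=1e-12):
    # analytic gradient of D_CS(x, w) with respect to the prototype w
    return w / (np.dot(w, w) + eps) - x / (np.dot(x, w) + eps)

def glvq_update(x, y, prototypes, labels, lr=0.01):
    # one gradient step on the GLVQ cost mu = (dJ - dK) / (dJ + dK), where
    # dJ (dK) is the divergence to the closest correct (wrong) prototype
    d = np.array([cs_divergence(x, w) for w in prototypes])
    same = labels == y
    j = int(np.argmin(np.where(same, d, np.inf)))
    k = int(np.argmin(np.where(~same, d, np.inf)))
    denom = (d[j] + d[k]) ** 2 + 1e-12
    prototypes[j] -= lr * (2.0 * d[k] / denom) * cs_grad_prototype(x, prototypes[j])
    prototypes[k] += lr * (2.0 * d[j] / denom) * cs_grad_prototype(x, prototypes[k])
    for i in (j, k):
        # keep prototypes non-negative and normalized, as divergences require
        prototypes[i] = np.clip(prototypes[i], 1e-12, None)
        prototypes[i] /= prototypes[i].sum()

The gradient used here follows from differentiating the logarithmic terms of the CS divergence; the cited papers derive such gradients more generally via Fréchet derivatives.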
Maximum Likelihood Topographic Map Formation
We introduce a new unsupervised learning algorithm for kernel-based topographic map formation of heteroscedastic Gaussian mixtures that allows for a unified account of distortion error (vector quantization), log-likelihood, and Kullback-Leibler divergence.
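As a point of reference (the notation here is illustrative, not taken from the paper), a heteroscedastic Gaussian mixture assigns each kernel its own width,

p(x) = \sum_k \pi_k \, \mathcal{N}(x \mid w_k, \sigma_k^2 I), \qquad \sum_k \pi_k = 1, \ \pi_k \ge 0,

so that, unlike in the homoscedastic case, the variances \sigma_k^2 may differ across map units.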
Prototype Based Classification Using Information Theoretic Learning
In this article we extend the (recently published) unsupervised information theoretic vector quantization approach based on the Cauchy-Schwarz divergence for matching data and prototype densities to supervised learning and classification. In particular, we first generalize the unsupervised method to more general metrics instead of the Euclidean, as it was used in the original algorithm. Thereaf...
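For orientation, in information theoretic learning the densities are typically Parzen window estimates; with isotropic Gaussian kernels G_\sigma of equal width (an assumption made here for simplicity), the cross-information term of the Cauchy-Schwarz divergence has the closed form

\int f(x)\, g(x)\, dx = \frac{1}{NM} \sum_{i=1}^{N} \sum_{j=1}^{M} G_{\sqrt{2}\sigma}(x_i - y_j)

for f = \frac{1}{N}\sum_i G_\sigma(\cdot - x_i) and g = \frac{1}{M}\sum_j G_\sigma(\cdot - y_j), since the convolution of two Gaussians is again a Gaussian.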
The Mathematics of Divergence Based Online Learning in Vector Quantization
We propose the utilization of divergences in gradient descent learning of supervised and unsupervised vector quantization as an alternative to the squared Euclidean distance. The approach is based on the determination of the Fréchet derivatives of the divergences, which can be immediately plugged into the online-learning rules. We provide the mathematical foundation of the respective framework. ...
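As a worked example of such a derivative (computed here for the discrete Cauchy-Schwarz case, not quoted from the paper), differentiating D_{CS}(p, w) with respect to the prototype w yields

\frac{\partial D_{CS}(p, w)}{\partial w_i} = \frac{w_i}{\sum_j w_j^2} - \frac{p_i}{\sum_j p_j w_j},

which can be inserted directly into an online rule of the form w \leftarrow w - \varepsilon \, \partial D / \partial w.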
Vector Quantization by Minimizing Kullback-Leibler Divergence
This paper proposes a new method for vector quantization by minimizing the Kullback-Leibler divergence between the class-label distributions over the quantization inputs (the original vectors) and the output (the quantization subsets of the vector set). In this way, the vector quantization output can keep as much information of the class label as possible. An objective function is...
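For reference, the Kullback-Leibler divergence between two discrete distributions p and q is

D_{KL}(p \,\|\, q) = \sum_i p_i \log \frac{p_i}{q_i};

the paper's specific objective over class-label distributions is not reproduced here.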